Prototyping the nkGraphics Tutorials

In this section, we will discover what the editor is capable of. For that, we will reproduce what the nkGraphics tutorials end up with in tutorial 07, about composition.

This will let us witness how transparently the API can be driven from the UI, and how easily we can do it. At the end of this tutorial, we will reach a rendering comparable to :

Final image

Ready to start ? Then let's go !

Launch

The editor is available in the Bin folder of the release archive. Once launched, you will be greeted by something like :

The editor
The editor !

A quick tour :

And that concludes the initial interface tour. From there, it is possible to access everything the editor has to offer.

Toying with resources

Let's start by loading the basic resources we need :

Loading the mesh

Let's load the mesh. Open the Resources menu, and select the Meshes item :

Select the mesh item
Resources menu, Meshes item

This will make another window pop up :

Mesh interface
This is the meshes resources manipulation window

This design is used for resource windows in general.

For meshes, in the version this tutorial is based on, you can rename them, save their declaration files, change their source data, and view / alter their attributes. Let's first create a resource. For that, click on the little "+" button you will find in the bottom left part of the window, under the resource list :

Mesh interface +
This little button is the way to create resources

This will open a sub-window :

Mesh creation
Resource creation window

Once more, you will find this window for each available resource type. At the top, you can specify a declaration file to load from in order to create the resource(s). The bottom part allows you to create a resource from scratch by specifying its name. Let's do that by entering the name "Sphere" and hitting enter, or the button next to the field. The mesh is now in the resource list, and if we click on it :

Mesh created
Our mesh in the interface

As you can see, the interface is now filled with what the mesh has to offer. It is named "Sphere", has no data path set yet, and its data only holds the vertex positions. While it can be surprising, this is true in this context : our mesh can be used right away, even though it is empty. But that would not be very fun to see, so let's search for a file to load. Hit the "..." button within the Source group, and search for "sphere.obj" within the Data/Meshes folder of the release.

Mesh source button
Load the source mesh from this button

Once done, the interface will be updated :

Mesh loaded
New information

The Source data path is now updated. We also witness new attributes provided by the mesh : positions, normals, uvs. This is what the mesh has available in its data, and it has been detected by the editor. API-wise, what we did is create a mesh and set its resource path. The loading is managed by the editor itself.
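
To relate this to code, here is a rough sketch of what those steps could look like through the API. Careful : every identifier below is an assumption made for illustration, not guaranteed verbatim nkGraphics API, so check it against the tutorials and documentation of your version :

// Hedged sketch : illustrative names only, assuming the nkGraphics headers are included
nkGraphics::Mesh* mesh = nkGraphics::MeshManager::getInstance()->createOrRetrieve("Sphere") ;

// Point the mesh to its source data, like the "..." button of the Source group did
mesh->setResourcePath("Data/Meshes/sphere.obj") ;

// The editor triggers the loading for us once the path changes
mesh->load() ;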

Adding it to the scene tree

Next step will be to add it to the scene. For that, we can close the mesh window and go back to the main window. Then, right click within the Scene Tree list :

Alter the scene tree
Right clicking opens a new menu

Altering the scene tree goes through right clicking on items within the list :

  1. Click on "Add Node" : you will get a new item in the list, Node_0.
  2. Right clicking on this new item offers new possibilities : select "Add Renderable".
  3. On this new "Entity" item, right click again and select "Append Mesh".
  4. In the list window that pops up, select the mesh, "Sphere", and hit ok.

The scene tree list should look like :

Scene tree final state
Final tree state we should expect

If we were doing this through the API, it would correspond to setting an entity in the render queue, adding a mesh to its sub entity, and plugging it to a node. And believe it or not, the sphere is actually rendered in the 3D Graphics window !
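
As a reference point, here is a hedged sketch of how this could look through the API. Names like renderQueue, getSubEntity, or setParentNode are assumptions for illustration, to be checked against the real nkGraphics calls :

// Hedged sketch : illustrative names, assuming a graphics context is already available
nkGraphics::RenderQueue* renderQueue = context->getRenderQueue() ;

// "Add Node" : create a node to carry the transform
nkGraphics::Node* node = nkGraphics::NodeManager::getInstance()->create("Node_0") ;

// "Add Renderable" : set an entity in the render queue
nkGraphics::Entity* entity = renderQueue->addEntity() ;

// "Append Mesh" : add the mesh in the entity's sub entity
entity->getSubEntity(0)->setMesh(mesh) ;

// Plug the entity to the node
entity->setParentNode(node) ;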

The input pattern within the scene

If you read the nkGraphics tutorial series, you probably remember that the sphere is centered around 0, which is exactly where our camera starts. While our reflex in nkGraphics was to move the node, there is no need for that in the editor : we will move the camera instead, using the input pattern available in the editor :

This camera controller scheme is quite similar to the way you would control a free camera within a First Person Shooter game. So, to move our camera away from the sphere, we need to :

You should see something similar to :

Visible sphere
The sphere as seen originally

From there, using the sphere as a visual reference, feel free to test the controller a bit and see how it feels. Once ready, let's proceed !

Working with the texture

Keep the mesh creation process in mind, as it illustrates how resources are created in the editor. With that, it should be fairly easy to create the texture.
Open the window through the menu bar : Resources > Textures.
Click the "+" button, specify the name, and select the texture within the list.
The next thing we want to do is load the cubemap. For that, click on the "Open File..." button in the bottom right and search for the Data/Textures/YokohamaNight.dds texture.

Loading a texture source
Open a texture file

The button to use for a texture source
Select the bottom right button to search for the texture

Source file loaded
Once loaded, the tools panel should update

Once done, you should be able to see that the texture type has been updated to TEX_CUBEMAP. Its dimensions and format are also deduced. The texture is ready to be used !

The hidden API process is : create the texture, set its resource path, and load it.
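
In code, this could look like the following hedged sketch. Identifiers, including the "EnvMap" name, are illustrative assumptions only :

// Hedged sketch : illustrative names only
nkGraphics::Texture* texture = nkGraphics::TextureManager::getInstance()->createOrRetrieve("EnvMap") ;

// Set the resource path, as done through the "Open File..." button
texture->setResourcePath("Data/Textures/YokohamaNight.dds") ;

// Loading deduces the type (TEX_CUBEMAP), the dimensions, and the format from the file
texture->load() ;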

Creating the programs

Like any other resource, programs are created through the dedicated "Shader Programs" window.
Close the Texture window, then in the menu bar : Resources > Programs.
Let's create 2 programs, one for the sphere and one for the environment behind it. For each of them, click on the "+" button under the list, and name them as you want.
We will first work with the one for the sphere : select it in the list.

Upon selection, the program will be flagged by default as a "Pipeline" program, with simple default HLSL sources. A program in the editor needs to be of a certain type, depending on the usage you will have for it :

Now, we want to assign this shader to an object, so let's keep it as a pipeline program. In each group, you will notice the program stages that can be written (vertex, domain...). We will need the vertex and pixel stages. Let's edit the vertex stage by clicking the "Edit" button next to it :

Editing a program
Click the edit button next to the vertex stage for a pipeline program

A new window will pop up, with default shader sources prefilled :

Editing a program sources
Source edition window

The menu bar specifies the name of the program being edited.
On the top, each tab corresponds to a program stage. When switching, you will find the currently active source for the given stage.
The macros group allows you to specify macros for the program compilation.
The sources group speaks for itself, and holds the sources of the program.
The "Nothing to show." part is the compilation log window, notifying you of potential errors when recompiling through the associated "Recompile" button in the bottom right.

Let's edit the sources of our vertex stage to match :

cbuffer PassBuffer : register(b0)
{
    matrix view ;
    matrix proj ;
    float3 camPos ;
}

struct VertexInput
{
    float4 position : POSITION ;
    float3 normal : NORMAL ;
    matrix world : WORLDMAT ;
} ;

struct PixelInput
{
    float4 position : SV_POSITION ;
    float3 normal : NORMAL ;
    float3 camDir : CAMDIR ;
} ;

PixelInput main (VertexInput input)
{
    PixelInput result ;

    // Classic model / view / projection transform
    matrix mvp = mul(input.world, mul(view, proj)) ;
    result.position = mul(input.position, mvp) ;
    result.normal = input.normal ;

    // Direction from the camera to the vertex, in world space
    result.camDir = normalize(mul(input.position, input.world) - camPos) ;

    return result ;
}

Now switch to the pixel stage tab, and change its sources to :

struct PixelInput
{
    float4 position : SV_POSITION ;
    float3 normal : NORMAL ;
    float3 camDir : CAMDIR ;
} ;

TextureCube tex : register(t0) ;
SamplerState customSampler : register(s0) ;

float4 main (PixelInput input) : SV_TARGET
{
    // Mirror the camera direction around the surface normal, and look the environment up
    float3 sampleDir = reflect(input.camDir, normalize(input.normal)) ;
    return tex.Sample(customSampler, sampleDir) ;
}

And hit the recompile button. The compilation should run fine, as reported by the log window :

Compilation result
Successful recompilation of the program !

In this part, the editor does a lot for us, prefilling information and ensuring the program is in a good, usable state.
However, even with that, you can maybe recognize the API : create the program, update its sources from memory, and load.
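
As a hedged sketch, with illustrative identifiers only (ProgramSourcesHolder, setFromMemory... are assumptions to check against your SDK), this maps to something like :

// Hedged sketch : vertexSrc and pixelSrc are assumed to hold the HLSL typed in the editor
nkGraphics::Program* program = nkGraphics::ProgramManager::getInstance()->createOrRetrieve("SphereProgram") ;

// Update the sources from memory
nkGraphics::ProgramSourcesHolder sources ;
sources.setVertexMemory(vertexSrc) ;
sources.setPixelMemory(pixelSrc) ;
program->setFromMemory(sources) ;

// Loading compiles the program : this is what the "Recompile" button triggers
program->load() ;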

Creating the shader

For the program to be used, we need a shader. Let's create it through its dedicated interface, with the usual process.
Close the program window, and hit Resources > Shaders.
Now hit "+" in the existing resources group, specify the name, and select the new shader.
You will get a window like this :

Shader window
We created the shader and selected it

The shader window can seem intimidating at first, but it is organized around two aspects :

With that in mind, the interface should be clearer. At the top, you set the program ; as a reminder, the shader's type will be deduced from the program type.
Then, the tab window at the bottom is where you will find the information about what the shader feeds to the program.
It is organized by resource type :

The first step, then, is to set the program to use.
Click on the "..." button in the program group, and select the program the shader should use :

Shader program choice
Choose the program through the dedicated button

Now, we need to specify which resources to feed it.
First, ensure the Textures tab is active.
The process to add a resource is as follows : through the sub-window, add a resource using the "+" button. It will be added with a default value.
Click on it, and change all the parameters needed to make it feed what we need.
For instance, for our texture : add it, click on it, and click the "..." button after the Texture label. Select the environment map, and validate :

Shader parameterized
Add the texture, select the new entry, and change the source

Which should result in something like :

Shader parameterized
Shader now feeds the texture we want

Then, go to the Samplers tab and add one sampler. The default one will be sufficient, so no need to edit anything.

Next we need to add a constant buffer. Switch to the Constant Buffers tab :

Constant buffer base
Constant buffer interface

As with every other resource, click the "+" in the Constant Buffers group to add a buffer. Select it in the list, which will update the rest of the interface :

Add a buffer
Add a buffer through the dedicated tab

A freshly created buffer has no pass slot, so we need to add some from its dedicated list. So, first, add a slot through the "+" button. Select the created slot, which is by default a vector, and alter the types to fit the HLSL sources :

Set parameters for pass slots
Pass slot setup steps

The type names are directly mapped to the enumeration values within nkGraphics.
With the type set, we don't need to alter any other parameter in the slots.
Repeat the process to fit what the cbuffer expects within the program. In the end, you should get something like :

Final constant buffer setup
Constant buffer aligned with the HLSL version

The order is important, as the resource is fed sequentially, from top to bottom.
We need a VIEW_MATRIX, a PROJ_MATRIX, and a CAM_POSITION slot, matching the view, proj, and camPos members of the HLSL cbuffer.
Finally, go into the Instance Buffers tab and add a slot, which will automatically be the World Matrix we need :

Add an instance slot
Add an instance slot within the dedicated tab

Once this is added, our shader uses the program and provides all the input it needs. We can go forward and use it !
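
Before moving on, here is a hedged sketch of the whole shader setup as it could look through the API. Every identifier is an illustrative assumption ; note how the pass slots follow the cbuffer declaration order :

// Hedged sketch : illustrative names only
nkGraphics::Shader* shader = nkGraphics::ShaderManager::getInstance()->createOrRetrieve("SphereShader") ;
shader->setProgram(program) ;

// Textures tab : feed the environment map, ending in register t0
shader->addTexture(texture) ;

// Samplers tab : the default sampler is enough
shader->addSampler() ;

// Constant Buffers tab : slots are fed sequentially, top to bottom
nkGraphics::ConstantBuffer* cbuffer = shader->addConstantBuffer() ;
cbuffer->addPassMemorySlot()->setAsViewMatrix() ;       // VIEW_MATRIX -> view
cbuffer->addPassMemorySlot()->setAsProjectionMatrix() ; // PROJ_MATRIX -> proj
cbuffer->addPassMemorySlot()->setAsCameraPosition() ;   // CAM_POSITION -> camPos

// Instance Buffers tab : the world matrix, fed per instance
shader->addInstanceMemorySlot()->setAsWorldMatrix() ;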

Shader assignment

Now we need to use the shader with the mesh currently shown. You probably saw it passing by : what we need is to set the shader via the scene tree. Close the shader window, switch back to the main window, right click on the Entity, and assign the pipeline shader we just created through the list :

Assign a shader through the scene tree
Shader assignment through the scene tree

The shader we need to alter is the "pipeline" one.
It is the shader used for a scene rendering pass using rasterization, which is what is currently done. The raytracing shader is the one used during a raytracing pass, which is something we will discover later.
The rendering in the 3D window should now be drastically different :

Reflections on the sphere
Our shader is now used to render the sphere

Creating the post processing shader

What is left before altering the composition is to create and set up the shader and program for the background. First, reopen the program window (Resources > Programs).
Create a new program (click on "+" and select it in the list), leave it as a pipeline program, and open the sources for the vertex stage. Replace them with :

cbuffer constants
{
    // Camera directions fed by the CAM_DIRECTION_WORLD slot, one per vertex
    float4 camDir [4] ;
}

struct VertexInput
{
    float4 position : POSITION ;
    uint vertexId : SV_VertexID ;
} ;

struct PixelInput
{
    float4 position : SV_POSITION ;
    float4 camDir : CAMDIR ;
} ;

PixelInput main (VertexInput input)
{
    PixelInput result ;

    // Positions come ready to use in a post processing pass
    result.position = input.position ;

    // Forward the direction : the rasterizer will interpolate it per pixel
    result.camDir = camDir[input.vertexId] ;

    return result ;
}

Switch to the pixel stage tab, and replace the sources with :

struct PixelInput
{
    float4 position : SV_POSITION ;
    float4 camDir : CAMDIR ;
} ;

TextureCube envMap : register(t0) ;
SamplerState customSampler : register(s0) ;

float4 main (PixelInput input) : SV_TARGET
{
    // Sample the environment along the interpolated world space direction
    return envMap.Sample(customSampler, normalize(input.camDir)) ;
}

We won't go over it again here ; please consult the tutorial about composition (07) within nkGraphics for that. Just know that the heart of it is using the post processing pass particularities : each vertex receives a camera direction that the rasterizer then interpolates, easily giving a world direction for each screen pixel, from which the environment map is sampled. Hit the recompile button, and ensure everything compiles without any problem.

It is time to create the attached shader. Close the program window, and go to Resources > Shaders in the main window menu bar.
In the window, create a new shader ("+" and select it in the list), and set its program ("..." in the program group) to the one just recompiled.
Add the texture (Textures tab, "+" in the texture list, select it, and "..." to set its source to the environment map).
Add the sampler (Samplers tab, "+" in the sampler list) and leave the default one as active.
Add the constant buffer and its slot (Constant Buffers tab, "+" in the buffer list, select it, add a slot through "+", and change its type in the settings).
The pass slot type required is the camera direction in world space (CAM_DIRECTION_WORLD).

With this, the shader is ready ! What is left is creating the compositor, and using it for rendering.

Creating the compositor

Compositors are like any other resource, and we need to open their window to manipulate them. This window might seem more complex at first sight, but it directly maps over what the API offers. As such, if you know the API, you should be able to understand what it is about and manipulate it right away. If not, let's see the window in detail :

Compositor Window
Compositor window

The window presents all aspects of a compositor in one place. It reads from top left to bottom right.

On the left is the list of compositors, as for every resource. Then comes the name control, and the heart of the compositor API. From there, you can add a node to the list. Upon selection, the node settings, activity and target operations, become visible in the next group box. Adding a target operation and selecting it leads to the next group, from which you can set the targets, viewport, and passes. Adding a pass and selecting it leads to the next group again, at the start of the next line. Its type can be changed, and depending on that type, adapted settings can be tweaked within the last group on the right.

Once a compositor fits your needs, its declaration can be saved through the dedicated group button.

From the UI, we need to reproduce the compositor built within the nkGraphics tutorial. The process will be as such :

Populate a compositor
Process to setup a compositor

The process is very close to the API (a code sketch follows the list) :

  1. Create a compositor
  2. Select it in its list
  3. Add a node in dedicated group
  4. Select it in its list
  5. Add a target operation
  6. Select it in its list
  7. Add a color target, and select the back buffer (NILKINS_BACK_BUFFER) in the choice list that pops up
  8. Add 3 passes
  9. For each of them (a, b, c), select it and set its type to :
    1. Clear pass (10a). No need to edit it after adding : the default is a clear pass fitting our needs
    2. Scene pass (10b). Select the Scene pass type and leave its default settings : by default, it will pick the render queue our sphere is in
    3. Post process pass (10c). Select the Post pass type. In the settings group popping up on the right (11c), change the shader to the one rendering the environment, and tick the back process box to make it render behind the mesh
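
As announced, here is a hedged sketch of how these steps could translate through the API. Every identifier is an illustrative assumption (the authoritative version lives in nkGraphics tutorial 07), including the envShader variable standing for our post processing shader :

// Hedged sketch : illustrative names mapped to the steps above
nkGraphics::Compositor* compositor = nkGraphics::CompositorManager::getInstance()->createOrRetrieve("Compositor") ;
nkGraphics::CompositorNode* node = compositor->addNode() ;
nkGraphics::TargetOperations* targetOp = node->addOperations() ;

// Step 7 : target the back buffer (NILKINS_BACK_BUFFER in the UI)
targetOp->setToBackBuffer(true) ;

// Steps 8 and 9 : the three passes, in order
targetOp->addClearTargetsPass() ;       // Defaults fit our needs
targetOp->addRenderScenePass() ;        // Default render queue, where our sphere lives

nkGraphics::PostProcessPass* postPass = targetOp->addPostProcessPass() ;
postPass->setProcessShader(envShader) ; // The shader rendering the environment
postPass->setBackProcess(true) ;        // Render behind the mesh

// And the "active compositor" change of the next section could map to :
context->setCompositor(compositor) ;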

Once set up, we need to tell the editor to use the compositor for its active rendering. For this, we need to access the rendering logic window through the Scene menu :

Scene Menu
Changing the logic goes through this sub menu

We are greeted by this window, in which we need to change the active compositor and select the one we created in the list :

Compositor change interface
In this interface, change the active compositor

Once this is done, the rendering window should reflect that :

Final image
Complete rendering override

Saving our work for later

Now that we've done all of this and are satisfied with the way it looks, it would be wise to save our work, to be able to reopen and iterate over it quickly.

In the main window menu : File > Save.
In the file browser that opens, find the folder in which you want to save, and validate. The saving process will create some files and folders ; it is advised to save into a dedicated folder to keep the structure clear.

And that's it ! The current setup is saved. When opening the editor again, just hit File > Load and open the saved project file. The scene will be loaded again, and you will be able to start from where you left off.

To conclude

As you probably noted, the editor drives the API in a very transparent way. The steps closely emulate the lines of code we would need to write in a program.
However, as low level as this can seem, it allows great control over the rendering we want. The UI on top adds a lot of comfort, and allows prototyping and iterating way quicker than during a standard C++ application development process.

And even with all that, the editor still has some functionalities left for us to unveil. Isn't that pretty ?